
    Optic flow based autopilot: From insects to rotorcraft and back

    When insects fly forwards, the image of the ground sweeps backwards across their ventral viewfield, forming an "optic flow" that depends on both the groundspeed and the height of flight. To explain how these animals manage to avoid the ground using this image-motion cue, we suggest that insect navigation hinges on a visual feedback loop, which we have called the optic flow regulator, that controls the vertical lift. To test this idea, we built a micro-helicopter equipped with a fly-inspired optic flow sensor and an optic flow regulator. We showed that this fly-by-sight microrobot can perform exacting tasks such as takeoff, level flight and landing. Our control scheme accounts for many hitherto unexplained findings published over the last 70 years on insects' visually guided performances, including the facts that honeybees descend under headwind conditions, land with a constant slope, and drown when travelling over mirror-smooth water. It also explains how insects manage to fly safely without any of the instruments used onboard aircraft to measure the height of flight, the airspeed, the groundspeed and the descent speed. An optic flow regulator could easily be implemented neurally, and it is just as appropriate for insects (1) as it is for aircraft (2,3).
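
    The gist of the optic flow regulator can be captured in a few lines. The sketch below assumes simple kinematic vertical dynamics and a proportional regulator; the set-point, gain and all names are illustrative, not values from the paper:

        OF_SETPOINT = 3.0          # rad/s: hypothetical ventral optic flow set-point
        KP, DT = 0.8, 0.01         # illustrative gain and integration step

        def ventral_optic_flow(ground_speed, height):
            # Looking straight down, translational optic flow is omega = v / h.
            return ground_speed / max(height, 0.01)

        def simulate(ground_speed=2.0, height=0.3, seconds=10.0):
            for _ in range(int(seconds / DT)):
                omega = ventral_optic_flow(ground_speed, height)
                error = omega - OF_SETPOINT       # > 0 means flying too low or too fast
                climb_rate = KP * error           # lift command raises or lowers the craft
                height = max(height + climb_rate * DT, 0.01)
            return height                         # settles near ground_speed / OF_SETPOINT

        if __name__ == "__main__":
            print(f"steady-state height: {simulate():.2f} m")   # ~0.67 m here

    Because the regulated quantity is v/h rather than v or h separately, a drop in groundspeed (e.g. a headwind) automatically produces a proportional descent, which is one of the behaviours the scheme explains.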

    Regulating the lateral optic flow to navigate along a corridor

    As a first step toward an Automatic Flight Control System (AFCS) for Micro-Air Vehicle (MAV) obstacle avoidance, we introduce a vision-based autopilot (LORA: Lateral Optic flow Regulation Autopilot) that makes a hovercraft automatically follow a wall or centre between the two walls of a corridor. A hovercraft is naturally stable in pitch and roll while retaining two translational degrees of freedom (X and Y) and one rotational degree of freedom (yaw). We show the feasibility of an OF regulator that maintains the lateral Optic Flow (OF) on one wall equal to an OF set-point. The OF sensors used are Elementary Motion Detectors (EMDs), whose principle was directly inspired by the housefly's motion-detecting neurons. The properties of these neurons were previously analysed at our laboratory by performing electrophysiological recordings while applying optical microstimuli to single photoreceptor cells of the compound eye. The simulation results show that, depending on the OF set-point, the hovercraft either centres along the midline of the corridor or follows one of the two walls, even when optical texture is locally missing on one wall, as caused, for instance, by an open door or a T-junction. All these navigational tasks are performed with one and the same feedback loop: a lateral OF regulation loop that permits relatively high-speed navigation (1 m/s, i.e., 3 body lengths per second) with a minimalist visual system (only two EMDs, each using two pixels). This principle contrasts with the formerly proposed strategy of equalizing the two lateral OFs. The passive visual sensors and the simple processing system are suitable for MAVs with an avionic payload of only a few grams. The goal is to achieve automatic MAV guidance, or to relieve a remote operator from guiding the vehicle in challenging environments such as urban canyons or indoors.
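
    A minimal sketch of the single-loop idea, assuming first-order side-slip kinematics; the gains, speed and geometry are illustrative, and the real LORA autopilot is more elaborate:

        OF_SETPOINT = 2.0      # rad/s: its value selects centring vs wall-following
        CORRIDOR_WIDTH = 1.0   # m
        SPEED = 1.0            # m/s: forward speed, held constant here
        KP, DT = 0.5, 0.01

        def lateral_flows(y):
            """Optic flow from the right wall (at distance y) and the left wall."""
            right = SPEED / max(y, 0.01)
            left = SPEED / max(CORRIDOR_WIDTH - y, 0.01)
            return right, left

        def step(y):
            right, left = lateral_flows(y)
            # Regulate the larger of the two lateral flows (the nearer wall):
            error = max(right, left) - OF_SETPOINT
            # Positive error: the nearer wall is too close -> side-slip away from it.
            away = 1.0 if right > left else -1.0
            return y + away * KP * error * DT

        y = 0.2                            # start 0.2 m from the right wall
        for _ in range(2000):
            y = step(y)
        print(f"settled {y:.2f} m from the right wall")   # ~0.50 m: centring

    With OF_SETPOINT = 2.0 the equilibrium is the corridor midline; raising it to, say, 4.0 makes the craft settle 0.25 m from the nearer wall, i.e. wall-following, mirroring the set-point dependence described above.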

    Event-based visual guidance inspired by honeybees in a 3D tapered tunnel

    In view of neuro-ethological findings on honeybees and our previously developed vision-based autopilot, in-silico experiments were performed in which a "simulated bee" was made to travel along a doubly tapering tunnel, including event-based controllers for the first time. The "simulated bee" was equipped with:
    • a minimalistic compound eye comprising 10 local motion sensors measuring the optic flow magnitude,
    • two optic flow regulators updating the control signals whenever specific optic flow criteria changed,
    • three event-based controllers taking the error signals into account, each in charge of its own translational dynamics.
    A MORSE/Blender-based simulation engine delivered what each of the 20 "simulated photoreceptors" saw in the tunnel, which was lined with high-resolution natural 2D images. The "simulated bee" managed to travel safely along the doubly tapering tunnel without requiring any speed or distance measurements, relying on a Gibsonian point of view alone, by:
    • concomitantly adjusting its side thrust, vertical lift and forward thrust whenever a change was detected in the optic-flow-based error signals,
    • avoiding collisions with the surface of the doubly tapering tunnel, and decreasing or increasing its speed depending on the clutter rate perceived by the motion sensors.
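
    One plausible reading of "event-based" here is that a control signal is recomputed only when its optic flow error changes appreciably; the class below sketches that idea with hypothetical names and thresholds:

        class EventBasedRegulator:
            def __init__(self, setpoint, gain, threshold=0.05):
                self.setpoint = setpoint      # rad/s: optic flow set-point
                self.gain = gain
                self.threshold = threshold    # rad/s: error change needed to trigger
                self.last_error = None
                self.command = 0.0

            def update(self, omega_measured):
                error = omega_measured - self.setpoint
                # Event condition: act only on a significant change in the error.
                if self.last_error is None or abs(error - self.last_error) > self.threshold:
                    self.last_error = error
                    self.command = self.gain * error   # recompute the control signal
                return self.command                    # otherwise hold the last command

        reg = EventBasedRegulator(setpoint=2.0, gain=0.5)
        print(reg.update(2.4), reg.update(2.42))       # second call holds the command

    One such regulator per translational axis (surge, sway, heave) would mirror the three event-based controllers described above.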

    Flying in 3D with an Insect-Based Visual Autopilot

    Flying insects rely on Optic Flow (OF) cues to avoid collisions, control their speed, control their height, and land. Recent studies have shown that the principle of "OF regulation" may account for various behaviours observed in freely flying insects. The aim of the present study was to propose a visually guided autopilot enabling an insect to navigate in 3D, and to test its robustness to natural images. In computer-simulation experiments, we simulated a bee flying through a tunnel wallpapered with natural images, controlling both its ground speed and its clearance from all four sides: the lateral walls, the ground, and the ceiling. The simulated bee can translate along three directions (the surge, sway, and heave axes): it is therefore fully actuated. The new visuo-motor control system, called ALIS (AutopiLot using an Insect-based vision System), is a dual OF regulator consisting of two interdependent feedback loops: the speed control loop (along the surge axis) and the positioning control loop (along both the sway and heave axes), each with its own OF set-point. The experiments show that the simulated bee navigates safely along a straight tunnel while compensating for the major OF perturbations caused, e.g., by a tapering of the tunnel or the lack of texture on one wall. The minimalistic visual system used here (only eight pixels) is robust to naturally contrasted stimuli and tunnels, and is sufficient to control both the clearance from the four sides and the forward speed jointly, without requiring any speed or distance measurements. Moreover, the ALIS autopilot accounts remarkably well for the quantitative results of ethological experiments performed on honeybees flying freely along straight or tapered corridors.
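
    The dual-regulator structure can be sketched as follows, reduced here to the lateral plane (the heave loop would mirror the sway loop); the set-points and gains are assumptions, not the paper's values:

        SUM_SETPOINT = 4.0     # rad/s: speed-control loop set-point
        SIDE_SETPOINT = 2.5    # rad/s: positioning loop set-point
        K_SPEED, K_SIDE = 0.4, 0.4

        def alis_commands(of_right, of_left):
            # Speed control loop (surge axis): hold the SUM of the opposite flows.
            surge_cmd = -K_SPEED * ((of_right + of_left) - SUM_SETPOINT)
            # Positioning loop (sway axis): hold the LARGER flow, slipping away
            # from the nearer wall whenever it exceeds its set-point.
            side_error = max(of_right, of_left) - SIDE_SETPOINT
            direction = 1.0 if of_right > of_left else -1.0
            sway_cmd = direction * K_SIDE * side_error
            return surge_cmd, sway_cmd

        # Too fast overall and too close to the right wall:
        print(alis_commands(3.5, 1.5))     # (-0.4, 0.4): brake and slip left

    In this sketch, a tapering tunnel raises both flows so the speed loop brakes, while a textureless wall zeroes one flow and the positioning loop simply regulates the remaining one, which matches the two perturbations the abstract describes.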

    Honeybees' Speed Depends on Dorsal as Well as Lateral, Ventral and Frontal Optic Flows

    Flying insects use the optic flow to navigate safely in unfamiliar environments, notably by adjusting their speed and their clearance from surrounding objects. It has not yet been established, however, which specific parts of the optic flow field insects use to control their speed. With a view to answering this question, freely flying honeybees were trained to fly along a specially designed tunnel comprising two successive tapering parts: the first tapered in the vertical plane and the second in the horizontal plane. The honeybees were found to adjust their speed on the basis of the optic flow they perceived not only in the lateral and ventral parts of their visual field, but also in the dorsal part. More specifically, the honeybees' speed varied monotonically with the minimum cross-section of the tunnel, regardless of whether the narrowing occurred in the horizontal or vertical plane: their speed decreased or increased whenever the minimum cross-section decreased or increased. In other words, the larger of the two sums of opposite optic flows (in the horizontal and vertical planes) was kept practically constant thanks to the speed control performed by the honeybees upon encountering a narrowing of the tunnel. The previously described ALIS ("AutopiLot using an Insect-based vision System") model nicely matches the present behavioral findings. The ALIS model is based on a feedback control scheme that explains how honeybees may keep their speed proportional to the minimum local cross-section of a tunnel, based solely on optic flow processing, without any need for speedometers or rangefinders. The present behavioral findings suggest how flying insects may succeed in adjusting their speed in their complex foraging environments, while also adjusting their distance not only from lateral and ventral objects but also from those located in their dorsal visual field.
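
    The proportionality between speed and tunnel cross-section follows directly from holding the summed flow constant: for centred flight in a tunnel of width D, the two opposite flows are each v/(D/2), so their sum is 4v/D and v = omega_sum * D / 4. A back-of-envelope check with illustrative numbers (the set-point below is not a fitted parameter):

        def predicted_speed(tunnel_width_m, of_sum_setpoint=3.2):
            # Centred flight: both surfaces at D/2, so the summed flow is 4 v / D.
            return of_sum_setpoint * tunnel_width_m / 4.0

        for width in (0.95, 0.60, 0.30):             # e.g. a tapering tunnel
            print(f"D = {width:.2f} m -> v = {predicted_speed(width):.2f} m/s")

    Halving the cross-section halves the predicted speed, which is the monotonic dependence reported above.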

    Toward an Autonomous Lunar Landing Based on Low-Speed Optic Flow Sensors

    For the last few decades, growing interest has returned to the challenging task of autonomous lunar landing. Soft landing of payloads on the lunar surface requires new means of ensuring a safe descent with stringent final conditions and aerospace constraints in terms of mass, cost and computational resources. In this paper, a two-phase approach is presented. First, a biomimetic method inspired by the neuronal and sensory system of flying insects is presented as a solution for safe lunar landing: to design an autopilot relying only on optic flow (OF) and inertial measurements, an estimation method based on a two-sensor setup is introduced, which allows us to accurately estimate the orientation of the velocity vector, a prerequisite for controlling the lander's pitch in a quasi-optimal way with respect to fuel consumption. Secondly, a new low-speed Visual Motion Sensor (VMS) inspired by insects' visual systems is presented, performing local angular 1-D speed measurements ranging from 1.5°/s to 25°/s and weighing only 2.8 g. It was tested under free-flying outdoor conditions over various fields onboard an 80 kg unmanned helicopter. These preliminary results show that, despite the complex disturbances encountered, the measured optic flow closely matched the ground-truth optic flow.
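
    One way to see why two optic flow sensors suffice to recover the velocity vector's orientation: over locally flat ground, a sensor gazing at angle psi below the horizon measures omega = (V/h) sin(psi) sin(psi - theta), where theta is the descent angle, so the ratio of two such readings is independent of both speed V and height h. The sketch below solves that ratio for theta by bisection; the geometry, angles and solver are illustrative assumptions, not the paper's estimation method:

        import math

        def flow(V, h, psi, theta):
            # Translational optic flow over flat ground, planar geometry.
            return (V / h) * math.sin(psi) * math.sin(psi - theta)

        def estimate_theta(omega1, omega2, psi1, psi2):
            """Recover the descent angle from the ratio of two flow readings."""
            target = omega1 / omega2
            def residual(theta):
                return (math.sin(psi1) * math.sin(psi1 - theta)) / \
                       (math.sin(psi2) * math.sin(psi2 - theta)) - target
            lo, hi = math.radians(-30), math.radians(50)   # stays clear of psi2
            for _ in range(60):                            # plain bisection
                mid = 0.5 * (lo + hi)
                if residual(lo) * residual(mid) <= 0:
                    hi = mid
                else:
                    lo = mid
            return 0.5 * (lo + hi)

        # Synthetic check: true descent angle 20 deg, sensors gazing at 90 and 60 deg.
        V, h, theta_true = 30.0, 500.0, math.radians(20)
        psi1, psi2 = math.radians(90), math.radians(60)
        w1, w2 = flow(V, h, psi1, theta_true), flow(V, h, psi2, theta_true)
        print(math.degrees(estimate_theta(w1, w2, psi1, psi2)))   # ~20.0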

    Towards Computational Models and Applications of Insect Visual Systems for Motion Perception: A Review

    Motion perception is a critical capability underpinning many aspects of insects' lives, including avoiding predators and foraging. A good number of motion detectors have been identified in insects' visual pathways. Computational modelling of these motion detectors has not only provided effective solutions for artificial intelligence, but has also benefited our understanding of complicated biological visual systems. These biological mechanisms, shaped by millions of years of evolution, provide solid modules for constructing dynamic vision systems for future intelligent machines. This article reviews the computational motion perception models originating from biological research on insects' visual systems. These models and neural networks comprise the looming-sensitive neuronal models of lobula giant movement detectors (LGMDs) in locusts, the translation-sensitive neural systems of direction-selective neurons (DSNs) in fruit flies, bees and locusts, and the small target motion detectors (STMDs) in dragonflies and hoverflies. We also review the applications of these models to robots and vehicles. Through these modelling studies, we summarise the methodologies that generate different direction and size selectivity in motion perception. Finally, we discuss the integration of multiple such systems and the hardware realisation of these bio-inspired motion perception models.
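
    To make the modelling flavour concrete, here is a minimal looming detector in the LGMD family, keeping only the excitation / lateral-inhibition / summation skeleton; real LGMD models add ON/OFF channels, feed-forward inhibition and spiking, and every parameter below is illustrative:

        import numpy as np

        def lgmd_step(frame, prev_frame, prev_excitation, w_inhib=0.6):
            """One timestep: frame-difference excitation minus a delayed,
            spatially spread (inhibitory) copy of the previous excitation."""
            excitation = np.abs(frame - prev_frame)
            pad = np.pad(prev_excitation, 1, mode="edge")
            inhibition = (pad[:-2, 1:-1] + pad[2:, 1:-1] +
                          pad[1:-1, :-2] + pad[1:-1, 2:]) / 4.0
            s = np.maximum(excitation - w_inhib * inhibition, 0.0)
            return s.mean(), excitation    # membrane-potential proxy, new state

        def square_frame(half_size, n=40):
            # A dark square on a bright background.
            img = np.ones((n, n))
            c = n // 2
            img[c - half_size:c + half_size, c - half_size:c + half_size] = 0.0
            return img

        # Demo: the square expands frame by frame, as in an approach.
        prev, exc, values = square_frame(2), np.zeros((40, 40)), []
        for half in range(3, 12):
            frame = square_frame(half)
            membrane, exc = lgmd_step(frame, prev, exc)
            values.append(membrane)
            prev = frame
        print(f"membrane rises from {values[0]:.3f} to {values[-1]:.3f} while looming")

    Expanding edges recruit ever more excited pixels while the delayed inhibition lags behind, so the summed signal grows throughout the approach; that growth-with-expansion is the core of the looming selectivity the review discusses.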

    Constant Angular Velocity Regulation for Visually Guided Terrain Following

    Insects use visual cues to control their flight behaviours. By estimating the angular velocity of the visual stimuli and regulating it to a constant value, honeybees can perform a terrain-following task that keeps a roughly constant height above undulating ground. To mimic this behaviour in a biologically plausible computational structure, this paper presents a new angular velocity decoding model based on behavioural experiments with honeybees. The model consists of three parts: a texture estimation layer for spatial information extraction, a motion detection layer for temporal information extraction, and a decoding layer that combines information from the previous layers to estimate the angular velocity. Compared to previous methods in this field, the proposed model produces responses largely independent of the spatial frequency and contrast in grating experiments. An angular-velocity-based control scheme is proposed to implement the model in a bee simulated in the Unity game engine. Reliable terrain following above patterned ground, and successful flight over irregularly textured terrain, show the model's potential for terrain following by micro unmanned aerial vehicles.
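
    The control half of the scheme reduces to holding the ventral angular velocity at a set-point while the forward speed stays fixed, which pins the clearance at v / omega_set. A minimal sketch, with the terrain profile, speed, gain and set-point all chosen for illustration:

        import math

        OMEGA_SET, SPEED, KP, DT = 4.0, 1.0, 2.0, 0.01   # rad/s, m/s, gain, s

        def terrain(x):
            return 0.2 * math.sin(0.8 * x)               # undulating ground (m)

        def simulate(seconds=30.0):
            x, z = 0.0, terrain(0.0) + 0.4                # start 0.4 m above ground
            for _ in range(int(seconds / DT)):
                h = max(z - terrain(x), 0.01)             # clearance over terrain
                omega = SPEED / h                         # ventral angular velocity
                climb = KP * (omega - OMEGA_SET)          # too much flow -> climb
                z += climb * DT
                x += SPEED * DT
            return z - terrain(x)

        print(f"final clearance: {simulate():.2f} m")     # ~0.25 m = SPEED / OMEGA_SET

    The decoding model described above would supply the omega estimate from vision; here it is taken as given, so the sketch covers only the regulation loop.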

    Bio-inspired motion detection in an FPGA-based smart camera module

    Köhler T, Roechter F, Lindemann JP, Möller R. Bio-inspired motion detection in an FPGA-based smart camera module. Bioinspiration & Biomimetics. 2009;4(1):015008.
    Flying insects, despite their relatively coarse vision and tiny nervous systems, are capable of carrying out elegant and fast aerial manoeuvres. Studies of the fly visual system have shown that this is accomplished by integrating the signals from a large number of elementary motion detectors (EMDs) in just a few global flow detector cells. We developed an FPGA-based smart camera module with more than 10000 single EMDs, closely modelled after insect motion-detection circuits with respect to overall architecture, resolution and inter-receptor spacing. Input to the EMD array is provided by a high-frame-rate CMOS camera. Designed as an adaptable solution for different engineering applications and as a testbed for biological models, the module allows the EMD detector type and parameters such as the EMD time constants, the motion-detection directions and the angle between correlated receptors to be reconfigured online. This allows flexible and simultaneous detection of complex motion fields such as translation, rotation and looming, so that various tasks, e.g., obstacle avoidance, height/distance control or speed regulation, can be performed by the same compact device.
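
    Each cell of such an array can be modelled as a Hassenstein-Reichardt correlator; below is a software sketch of a single EMD, with the time constant, sampling step and test stimulus chosen purely for illustration:

        import math

        def emd_stream(receptor_a, receptor_b, dt=0.001, tau=0.03):
            """Correlate each receptor's low-pass-delayed signal with its
            neighbour's undelayed signal; the mirror difference is direction
            selective: > 0 for A->B motion, < 0 for B->A."""
            alpha = dt / (tau + dt)            # first-order low-pass coefficient
            lp_a = lp_b = 0.0
            outputs = []
            for a, b in zip(receptor_a, receptor_b):
                outputs.append(lp_a * b - lp_b * a)
                lp_a += alpha * (a - lp_a)     # update the delay lines after use
                lp_b += alpha * (b - lp_b)
            return outputs

        # Test: a 2 Hz sinusoidal luminance signal reaching B after A (A->B motion).
        t = [i * 0.001 for i in range(2000)]
        a = [math.sin(2 * math.pi * 2 * s) for s in t]
        b = [math.sin(2 * math.pi * 2 * s - 0.8) for s in t]
        print(sum(emd_stream(a, b)) > 0)       # True: net response signals A->B

    The online-reconfigurable parameters listed in the abstract (time constants, motion-detection directions, receptor pairing) correspond here to tau and to the choice of which receptor pair feeds each correlator.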